TSIRM: A two-stage iteration with least-squares residual minimization algorithm to solve large sparse linear and nonlinear systems

Authors

  • Raphaël Couturier
  • Lilia Ziane Khodja
  • Christophe Guyeux
Abstract

In this paper, a two-stage iterative algorithm is proposed to improve the convergence of Krylov-based iterative methods, typically GMRES variants. The principle of the proposed approach is to build an external iteration over the Krylov method and to frequently store its current residual (at each GMRES restart, for instance). After a given number of outer iterations, a least-squares minimization step is applied to the matrix composed of the saved residuals, in order to compute a better solution and to perform new iterations if required. It is proven that the proposal has the same convergence properties as the inner embedded method itself. Several experiments have been performed with the PETSc toolkit (using default parameters where not otherwise specified) to solve linear and nonlinear problems. They show good speedups compared to GMRES on up to 16,394 cores with different preconditioners.
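
The abstract describes the method only at a high level. As a minimal sketch of the two-stage idea, the following Python/SciPy code runs restarted GMRES as the inner solver, saves the iterate produced by each restart cycle, and periodically replaces the current iterate by the least-squares-optimal combination of the saved vectors. The function names, parameters, and the choice to save the inner iterates (rather than the residuals themselves) are illustrative assumptions, not the authors' PETSc implementation, and the dense least-squares solve stands in for whatever iterative least-squares method a large-scale implementation would use.

```python
# Illustrative sketch of a two-stage iteration with least-squares residual
# minimization; names and parameters are assumptions, not the authors' code.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def two_stage_gmres(A, b, n_outer=20, n_saved=5, restart=30, tol=1e-8):
    """Outer loop around restarted GMRES; every n_saved outer iterations,
    minimize ||b - A @ S @ alpha|| over the block S of saved iterates."""
    x = np.zeros_like(b)
    S = []                                   # saved iterates, one per restart cycle
    for _ in range(n_outer):
        # Inner stage: one restart cycle of GMRES from the current iterate.
        x, _ = spla.gmres(A, b, x0=x, restart=restart, maxiter=1)
        S.append(x.copy())
        if len(S) == n_saved:
            # Least-squares stage: best linear combination of the saved vectors.
            Smat = np.column_stack(S)        # n x n_saved
            alpha, *_ = np.linalg.lstsq(A @ Smat, b, rcond=None)
            x = Smat @ alpha
            S = []                           # start a new block
        if np.linalg.norm(b - A @ x) <= tol * np.linalg.norm(b):
            break
    return x

# Toy usage: 1-D Poisson-like tridiagonal system.
n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = two_stage_gmres(A, b)
print(np.linalg.norm(b - A @ x))
```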


Similar articles

Sparse Solution of Underdetermined Linear Equations by Stagewise Orthogonal Matching Pursuit

Finding the sparsest solution to underdetermined systems of linear equations y = Φx is NP-hard in general. We show here that for systems with ‘typical’/‘random’ Φ, a good approximation to the sparsest solution is obtained by applying a fixed number of standard operations from linear algebra. Our proposal, Stagewise Orthogonal Matching Pursuit (StOMP), successively transforms the signal into a n...
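
Since the summary above is truncated, here is a hedged sketch of the usual StOMP stage (matched-filter correlations, hard thresholding at a multiple of the formal noise level, then projection onto the enlarged active set). The stage count, the threshold constant, and the variable names are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of Stagewise Orthogonal Matching Pursuit (StOMP).
import numpy as np

def stomp(Phi, y, n_stages=10, tau=2.5):
    n, N = Phi.shape
    r = y.copy()                                  # current residual
    active = np.zeros(N, dtype=bool)              # selected columns of Phi
    x = np.zeros(N)
    for _ in range(n_stages):                     # fixed number of stages
        c = Phi.T @ r                             # matched-filter correlations
        thresh = tau * np.linalg.norm(r) / np.sqrt(n)   # formal noise level
        new = np.abs(c) > thresh
        if not new.any():
            break
        active |= new                             # grow the active set
        cols = np.flatnonzero(active)
        coef, *_ = np.linalg.lstsq(Phi[:, cols], y, rcond=None)
        x = np.zeros(N)
        x[cols] = coef                            # project y onto the active columns
        r = y - Phi @ x
    return x

# Toy usage: recover a sparse vector from random Gaussian measurements.
rng = np.random.default_rng(0)
n, N, k = 80, 256, 8
Phi = rng.standard_normal((n, N)) / np.sqrt(n)
x_true = np.zeros(N)
x_true[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
x_hat = stomp(Phi, Phi @ x_true)
print(np.linalg.norm(x_hat - x_true))
```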


Jacobian-Free Three-Level Trust Region Method for Nonlinear Least Squares Problems

Nonlinear least squares (NLS) problems arise in many applications. Common solvers require computing and storing the corresponding Jacobian matrix explicitly, which is too expensive for large problems. In this paper, we propose an effective Jacobian-free method, aimed especially at large NLS problems, built on a novel combination of automatic differentiation for J(x)v and J^T(x)v along with...
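
The summary mentions matrix-free products J(x)v and J^T(x)v obtained by automatic differentiation. As a hedged illustration of the Jacobian-free building block only, the sketch below approximates J(x)v with a forward finite difference on an invented toy residual F; it is a stand-in for the paper's AD approach, not its implementation.

```python
# Matrix-free Jacobian-vector product via a forward difference (illustrative).
import numpy as np

def jvp_fd(F, x, v, eps=1e-7):
    """Approximate J(x) @ v for a residual function F without forming J."""
    return (F(x + eps * v) - F(x)) / eps

def F(x):
    # Toy residual of a small nonlinear least-squares problem (assumption).
    return np.array([x[0]**2 + x[1] - 1.0, x[0] - x[1]**2])

x = np.array([0.5, 0.5])
v = np.array([1.0, 0.0])
print(jvp_fd(F, x, v))   # analytic J(x) @ v = [2*x[0], 1.0] = [1.0, 1.0]
```

Note that a forward difference cannot supply the transpose product J^T(x)v; reverse-mode automatic differentiation can, which is presumably why the paper relies on AD for both products.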


Iterative Scaled Trust-Region Learning in Krylov Subspaces via Pearlmutter's Implicit Sparse Hessian-Vector Multiply

The online incremental gradient (or backpropagation) algorithm is widely considered to be the fastest method for solving large-scale neural-network (NN) learning problems. In contrast, we show that an appropriately implemented iterative batch-mode (or block-mode) learning method can be much faster. For example, it is three times faster in the UCI letter classification problem (26 outputs, 16,00...


Exact and approximate solutions of fuzzy LR linear systems: New algorithms using a least squares model and the ABS approach

We present a methodology for characterizing, and an approach for computing, the solutions of fuzzy linear systems with LR fuzzy variables. Both exact and approximate solutions are considered. We transform the fuzzy linear system into a corresponding crisp linear system and a constrained least squares problem. If the corresponding crisp system is incompatible, then the fuzzy ...


A new non-linear least squares algorithm for the seismic inversion problem

We describe the problem of simultaneous determination of hypocentre locations and velocity structure, using data from microearthquake networks, and briefly discuss some of the techniques which have been proposed for its solution. In particular we examine the method of Thurber, as implemented in his program SIMUWL, where the problem is formulated as a non-linear least-squares problem. A new prog...



Journal:
  • J. Comput. Science

Volume: 17, Issue: -

Pages: -

Publication date: 2016